A Comparative Framework for Preconditioned Lasso Algorithms
Abstract
Consider the SVD $X = UDV^\top$, where $U$ is $n \times n$, $V$ is $p \times p$, and $D$ is an $n \times p$ "diagonal" matrix with entries $d_1 < \dots < d_n$. Define two groups of left and right singular vectors associated with the $q$ smallest and the $n-q$ largest singular values, and denote them by $U_q, U_{n-q}$ and $V_q, V_{n-q}$. Suppose HJ chooses as its row-basis the $n-q$ largest right singular vectors, $V_{n-q}$. Then, from Table 1 of Huang and Jojic [1] we find that

$$
\begin{aligned}
Z &= X V_{n-q} = U_{n-q}\,\operatorname{diag}(\{d_j\}_{j>q}) && (1)\\
\bar{X} = R &= X - Z V_{n-q}^\top && (2)\\
&= X - U_{n-q}\,\operatorname{diag}(\{d_j\}_{j>q})\, V_{n-q}^\top && (3)
\end{aligned}
$$
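As a rough illustration of (1)–(3), here is a minimal NumPy sketch (not taken from the paper) that forms $Z$ and the residual $R$ from the SVD of a random $X$; the dimensions, variable names, and the choice of $q$ are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 50, 200, 10
X = rng.standard_normal((n, p))

# Thin SVD: NumPy returns singular values in *decreasing* order, so the first
# n - q columns correspond to the n - q largest singular values (the set
# {d_j : j > q} in the paper's ascending-order notation).
U, d, Vt = np.linalg.svd(X, full_matrices=False)

U_top = U[:, : n - q]       # left singular vectors,  n x (n - q)
V_top = Vt[: n - q, :].T    # right singular vectors, p x (n - q)
d_top = d[: n - q]          # the n - q largest singular values

# (1)  Z = X V_{n-q} = U_{n-q} diag({d_j}_{j > q})
Z = X @ V_top
assert np.allclose(Z, U_top * d_top)

# (2)-(3)  X_bar = R = X - Z V_{n-q}^T = X - U_{n-q} diag({d_j}_{j > q}) V_{n-q}^T
R = X - Z @ V_top.T
assert np.allclose(R, X - (U_top * d_top) @ V_top.T)

print(Z.shape, R.shape)     # (50, 40) (50, 200)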
Similar Articles
A Comparative Framework for Preconditioned Lasso Algorithms
The Lasso is a cornerstone of modern multivariate data analysis, yet its performance suffers in the common situation in which covariates are correlated. This limitation has led to a growing number of Preconditioned Lasso algorithms that pre-multiply $X$ and $y$ by matrices $P_X$, $P_y$ prior to running the standard Lasso. A direct comparison of these and similar Lasso-style algorithms to the original La...
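To make the generic recipe concrete, a small sketch (not drawn from any of the papers listed here) is given below: a helper that pre-multiplies the data by user-supplied $P_X$, $P_y$ and then fits an ordinary Lasso with scikit-learn. The function name, the penalty level, and the identity preconditioners in the example are assumptions for illustration; choosing both preconditioners as the identity simply recovers the plain Lasso.

```python
import numpy as np
from sklearn.linear_model import Lasso

def preconditioned_lasso(X, y, P_X, P_y, alpha=0.1):
    """Run the standard Lasso on the pre-multiplied data (P_X X, P_y y)."""
    X_tilde = P_X @ X
    y_tilde = P_y @ y
    model = Lasso(alpha=alpha, fit_intercept=False)
    model.fit(X_tilde, y_tilde)
    return model.coef_

# Example: identity preconditioners reduce to the ordinary Lasso.
rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + 0.1 * rng.standard_normal(n)

coef = preconditioned_lasso(X, y, np.eye(n), np.eye(n), alpha=0.05)
print(np.nonzero(coef)[0])   # indices of the selected covariates
```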
Performance of Tunisian Public Hospitals: A Comparative Assessment Using the Pabón Lasso Model
Background and Objectives: Constant monitoring of healthcare organizations’ performance is an integral part of informed health policy-making. Several hospital performance assessment methods have been proposed in the literature. Pabon Lasso Model offers a fast and convenient method for comparative evaluation of hospital performance. This study aimed to evaluate the relative performance of hospit...
“Preconditioning” for Feature Selection and Regression in High-dimensional Problems, by Debashis Paul et al.
We consider regression problems where the number of predictors greatly exceeds the number of observations. We propose a method for variable selection that first estimates the regression function, yielding a “preconditioned” response variable. The primary method used for this initial regression is supervised principal components. Then we apply a standard procedure such as forward stepwise select...
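The two-stage idea above (fit first, then select on the fitted values) can be sketched as follows. This is a simplified stand-in rather than the authors' procedure: correlation screening followed by a single principal component approximates the supervised-principal-components step, and a Lasso replaces forward stepwise selection; the screening size `k` and the penalty level are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, p, k = 80, 1000, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 1.0
y = X @ beta + 0.5 * rng.standard_normal(n)

# Step 1: screen features by absolute correlation with y (a crude stand-in
# for the "supervised" part), then take the first principal component of the
# screened block and project y onto it to obtain a preconditioned response.
scores = np.abs(X.T @ y) / n
keep = np.argsort(scores)[-k:]
Xs = X[:, keep] - X[:, keep].mean(axis=0)
_, _, Vt = np.linalg.svd(Xs, full_matrices=False)
pc1 = Xs @ Vt[0]
y_hat = pc1 * (pc1 @ y) / (pc1 @ pc1)   # fitted ("preconditioned") response

# Step 2: run a standard sparse selector on the preconditioned response
# (a Lasso here, in place of forward stepwise selection).
selector = Lasso(alpha=0.05, fit_intercept=False).fit(X, y_hat)
print(np.nonzero(selector.coef_)[0][:10])
```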
Stability Analysis of LASSO and Dantzig Selector via Constrained Minimal Singular Value of Gaussian Sensing Matrices
In this paper, we introduce a new framework for interpreting the existing theoretical stability results of sparse signal recovery algorithms in practical terms. Our framework is built on the theory of constrained minimal singular values of Gaussian sensing matrices. Adopting our framework, we study the stability of two algorithms, namely LASSO and Dantzig selector. We demonstrate that for a giv...
A Unified Robust Regression Model for Lasso-like Algorithms
We develop a unified robust linear regression model and show that it is equivalent to a general regularization framework to encourage sparse-like structure that contains group Lasso and fused Lasso as specific examples. This provides a robustness interpretation of these widely applied Lasso-like algorithms, and allows us to construct novel generalizations of Lasso-like algorithms by considering...